82 research outputs found

    Invariant Feature Mappings for Generalizing Affordance Understanding Using Regularized Metric Learning

    This paper presents an approach for learning invariant features for object affordance understanding. One of the major problems for a robotic agent acquiring a deeper understanding of affordances is finding sensory-grounded semantics. Being able to understand what in the representation of an object makes the object afford an action opens up possibilities for more efficient manipulation, interchange of objects that might not be visually similar, transfer learning, and robot-to-human communication. Our approach uses a metric learning algorithm that learns a feature transform encouraging objects that afford the same action to be close in the feature space. We regularize the learning so that irrelevant features are penalized, allowing the agent to link the affordance to what in the sensory input caused the object to afford the action. From this, we show how the agent can abstract the affordance and reason about the similarity between different affordances.
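    As an illustration of the general recipe described above (not the authors' implementation), the sketch below learns a linear feature transform with a contrastive objective that pulls together objects affording the same action, plus an L1 penalty that suppresses irrelevant input features; all data, dimensions, and hyperparameters are assumptions.

```python
# Illustrative sketch only: regularized metric learning for affordances.
# A linear transform W maps raw object features to an embedding; a
# contrastive loss pulls together objects that afford the same action
# and pushes apart objects that do not, while an L1 penalty on W
# suppresses irrelevant input features. Data here is synthetic.
import torch

n_objects, n_features, n_embed = 200, 32, 8
x = torch.randn(n_objects, n_features)          # object feature vectors (assumed)
affords = torch.randint(0, 2, (n_objects,))     # 1 if the object affords the action

W = torch.zeros(n_features, n_embed, requires_grad=True)
torch.nn.init.xavier_uniform_(W)
opt = torch.optim.Adam([W], lr=1e-2)

def contrastive_loss(z, labels, margin=1.0):
    # Pairwise distances in the embedded space.
    diff = z[:, None, :] - z[None, :, :]
    d2 = (diff ** 2).sum(-1)
    d = (d2 + 1e-9).sqrt()
    same = (labels[:, None] == labels[None, :]).float()
    pull = same * d2                                            # same affordance: pull together
    push = (1 - same) * torch.clamp(margin - d, min=0).pow(2)   # different: push beyond margin
    return (pull + push).mean()

for step in range(500):
    opt.zero_grad()
    loss = contrastive_loss(x @ W, affords) + 1e-2 * W.abs().sum()  # L1 regularizer
    loss.backward()
    opt.step()

# Rows of W with near-zero magnitude correspond to input features the
# model judged irrelevant to the affordance.
feature_relevance = W.abs().sum(dim=1)
```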

    Probabilistic consolidation of grasp experience

    We present a probabilistic model for the joint representation of several sensory modalities and action parameters in a robotic grasping scenario. Our non-linear probabilistic latent variable model encodes relationships between grasp-related parameters, learns the importance of features, and expresses confidence in its estimates. The model learns associations between stable and unstable grasps that it experiences during an exploration phase. We demonstrate the applicability of the model for estimating grasp stability, correcting grasps, identifying objects based on tactile imprints, and predicting tactile imprints from object-relative gripper poses. We performed experiments on a real platform with both known and novel objects, i.e., objects the robot was trained with and previously unseen objects. Grasp correction had a 75% success rate on known objects and 73% on novel objects. We compared our model to a traditional regression model, which succeeded in correcting grasps in only 38% of cases.
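    The paper's model is non-linear and learned from robot experience; as a deliberately simplified stand-in that still shows the joint-representation idea (predicting one modality from another by conditioning a joint Gaussian over pose, tactile, and stability dimensions), one might write something like the following. All data, dimensions, and numbers below are synthetic assumptions, not the authors' model.

```python
# Simplified stand-in: a joint Gaussian over concatenated modalities
# (gripper pose, tactile imprint, stability), queried by Gaussian
# conditioning to predict unobserved modalities from observed ones.
import numpy as np

rng = np.random.default_rng(0)
n = 500
pose = rng.normal(size=(n, 6))                                   # object-relative gripper pose
tactile = pose @ rng.normal(size=(6, 12)) + 0.1 * rng.normal(size=(n, 12))
stability = (tactile.sum(axis=1, keepdims=True) > 0).astype(float)
data = np.hstack([pose, tactile, stability])                     # joint representation

mu = data.mean(axis=0)
cov = np.cov(data, rowvar=False) + 1e-6 * np.eye(data.shape[1])  # regularized covariance

def condition(observed_idx, observed_vals, query_idx):
    """Mean and covariance of the query dims given the observed dims."""
    S_qq = cov[np.ix_(query_idx, query_idx)]
    S_qo = cov[np.ix_(query_idx, observed_idx)]
    S_oo = cov[np.ix_(observed_idx, observed_idx)]
    gain = S_qo @ np.linalg.inv(S_oo)
    mean = mu[query_idx] + gain @ (observed_vals - mu[observed_idx])
    return mean, S_qq - gain @ S_qo.T

# Predict the tactile imprint (dims 6..17) from a gripper pose (dims 0..5),
# analogous to predicting tactile imprints from object-relative gripper poses.
pred_mean, pred_cov = condition(np.arange(6), pose[0], np.arange(6, 18))
```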

    Neuromorphic tactile sensor array based on fiber Bragg gratings to encode object qualities

    Emulating the sense of touch is fundamental to endowing robotic systems with perception abilities. This work presents an unprecedented mechanoreceptor-like neuromorphic tactile sensor implemented with fiber optic sensing technologies. A robotic gripper was sensorized using soft and flexible tactile sensors based on Fiber Bragg Grating (FBG) transducers and a neuro-bio-inspired model to extract tactile features. The FBGs connected to the neuron model emulated biological mechanoreceptors by encoding tactile information as spikes. Converting the inflowing tactile information into event-based spikes reduces the bandwidth required for communication between the sensing and computational subsystems of the robot. The outputs of the sensor were converted into spiking on-off events by an architecture implemented in a Field Programmable Gate Array (FPGA) and applied to robotic manipulation tasks to evaluate the effectiveness of this information encoding strategy. Different tasks were performed with the objective of achieving fine manipulation abilities using the features extracted from the grasped objects (i.e., size and hardness). This is envisioned as a futuristic sensor technology combining two promising technologies: optical and neuromorphic sensing.
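    The paper's neuron model runs on an FPGA and is not reproduced here; a leaky integrate-and-fire (LIF) neuron is one common way to turn an analog FBG reading into event-based spikes, sketched below with assumed parameters, units, and synthetic input.

```python
# Illustrative sketch, not the paper's FPGA implementation: an LIF neuron
# turning an analog FBG reading (e.g., Bragg wavelength shift caused by
# contact pressure) into sparse spike events.
import numpy as np

dt = 1e-3                       # timestep [s]
tau = 20e-3                     # membrane time constant [s]
v_thresh, v_reset = 1.0, 0.0    # firing threshold and reset potential

t = np.arange(0, 1.0, dt)
fbg_shift = np.clip(np.sin(2 * np.pi * 2 * t), 0, None) * 3.0  # synthetic input signal

v = 0.0
spikes = []
for k, i_in in enumerate(fbg_shift):
    # Leaky integration of the input, then threshold-and-reset.
    v += dt / tau * (-v + i_in)
    if v >= v_thresh:
        spikes.append(t[k])     # emit an event (spike time)
        v = v_reset

# Larger wavelength shift -> higher spike rate, so only sparse event
# timestamps need to travel from the sensor to the robot's controller.
print(f"{len(spikes)} spikes emitted")
```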

    Tactile Sensing and Control of Robotic Manipulator Integrating Fiber Bragg Grating Strain-Sensor

    Tactile sensing is an instrumental modality for robotic manipulation, as it provides information that is not accessible via remote sensors such as cameras or lidars. Touch is particularly crucial in unstructured environments, where the robot's internal representation of manipulated objects is uncertain. In this study we present the sensorization of an existing artificial hand, with the aim of achieving fine control of robotic limbs and perception of objects' physical properties. Tactile feedback is conveyed by a soft sensor integrated at the fingertip of a robotic hand. The sensor consists of an optical fiber, housing Fiber Bragg Grating (FBG) transducers, embedded in a soft polymeric material integrated on a rigid hand. The ability of the system to acquire information is assessed through several tasks involving grasps of different objects in various conditions. Results show that a classifier based on the sensor outputs of the robotic hand is capable of accurately detecting both the size and the rigidity of the manipulated objects (99.36% and 100% accuracy, respectively). Furthermore, the outputs provide evidence of the ability to grasp fragile objects without breakage or slippage and to perform dynamic manipulation tasks that involve adapting finger positions based on the grasped object's condition.
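    As a minimal sketch of the classification step mentioned above, the snippet below trains classifiers for object size and rigidity on synthetic stand-ins for FBG-derived grasp features; the actual features, labels, and classifier used in the study are not specified here, so everything below is an assumption.

```python
# Minimal sketch: classify object size and rigidity from grasp features.
# Synthetic data stands in for FBG-derived features such as peak and
# steady-state wavelength shifts during a grasp.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_grasps = 300
features = rng.normal(size=(n_grasps, 10))            # per-grasp feature vectors (assumed)
size_label = (features[:, 0] > 0).astype(int)          # small vs. large (synthetic)
rigidity_label = (features[:, 1] > 0).astype(int)      # soft vs. rigid (synthetic)

size_acc = cross_val_score(RandomForestClassifier(), features, size_label, cv=5).mean()
rigidity_acc = cross_val_score(RandomForestClassifier(), features, rigidity_label, cv=5).mean()
print(f"size accuracy: {size_acc:.2f}, rigidity accuracy: {rigidity_acc:.2f}")
```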

    Editorial: Robotic In-Situ Servicing, Assembly and Manufacturing

    This research topic is dedicated to articles focused on robotic manufacturing, assembly, and servicing utilizing in-situ resources, especially for space robotics applications. The purpose was to gather resource material for researchers from a variety of disciplines to identify common themes, formulate problems, and share promising technologies for autonomous large-scale construction, servicing, and assembly robots. The articles under this special topic provide a snapshot of several key technologies under development to support on-orbit robotic servicing, assembly, and manufacturing.
    • …